A globally convergent hybrid conjugate gradient method with strong Wolfe conditions for unconstrained optimization
Authors
Abstract
Similar articles
A new hybrid conjugate gradient algorithm for unconstrained optimization
In this paper, a new hybrid conjugate gradient algorithm is proposed for solving unconstrained optimization problems. The new method generates sufficient descent directions independently of any line search. Moreover, global convergence of the proposed method is proved under the Wolfe line search. Numerical experiments are also presented to show the efficiency of the proposed algorithm, espe...
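The abstracts on this page describe hybrid conjugate gradient (CG) iterations paired with a (strong) Wolfe line search. As a minimal illustrative sketch, not the method of any paper listed here, the following uses SciPy's strong-Wolfe line search together with one common hybrid safeguard, beta = max(0, min(beta_PRP, beta_FR)); the function names and tolerances are assumptions for the example.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Illustrative hybrid CG sketch with a strong Wolfe line search.

    Uses the safeguard beta = max(0, min(beta_PRP, beta_FR)); this is a
    common hybrid choice, not the specific method of the papers above.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search returns a step satisfying strong Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # fallback if the line search fails
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)          # Fletcher-Reeves
        beta_prp = (g_new @ (g_new - g)) / (g @ g)   # Polak-Ribiere-Polyak
        beta = max(0.0, min(beta_prp, beta_fr))      # hybrid safeguard
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic such as f(x) = x1^2 + 10*x2^2, the iteration drives the gradient norm below the tolerance in a handful of steps.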
A Conjugate Gradient Method with Strong Wolfe-Powell Line Search for Unconstrained Optimization
In this paper, a modified conjugate gradient method is presented for solving large-scale unconstrained optimization problems; it possesses the sufficient descent property under the strong Wolfe-Powell (SWP) line search. A global convergence result is proved when the SWP line search is used under some conditions. Computational results for a set consisting of 138 unconstrained optimization test probl...
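For reference, the strong Wolfe-Powell conditions invoked in several of these abstracts require the step size $\alpha_k$ along a search direction $d_k$ to satisfy (this is the standard textbook statement, not a result specific to any paper above):

```latex
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^T d_k,
\qquad
\left| \nabla f(x_k + \alpha_k d_k)^T d_k \right| \le c_2 \left| \nabla f(x_k)^T d_k \right|,
\qquad 0 < c_1 < c_2 < 1.
```

The first inequality enforces sufficient decrease; the second bounds the magnitude of the directional derivative at the new point, which is what distinguishes the strong from the ordinary Wolfe conditions.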
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of line search method, based on eigenvalue analysis. The globa...
New hybrid conjugate gradient method for unconstrained optimization
Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. Most conjugate gradient methods do not always generate a descent search direction, so the descent condition is usually assumed in analyses and implementations. Dai and Yuan (1999) proposed a conjugate gradient method which generates a descent direction at every iteration. Yabe and...
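The Dai-Yuan method mentioned above is defined by the well-known update parameter (with $g_k = \nabla f(x_k)$ and $y_{k-1} = g_k - g_{k-1}$):

```latex
\beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T y_{k-1}},
\qquad
d_k = -g_k + \beta_k^{DY} d_{k-1},
```

which guarantees a descent direction at every iteration under the Wolfe line search, the property the abstract refers to.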
Globally convergent modified Perry's conjugate gradient method
Conjugate gradient methods are among the most popular iterative methods for solving large-scale optimization problems in scientific and engineering computation, characterized by the simplicity of their iteration and their low memory requirements. In this paper, we propose a new conjugate gradient method which is based on the MBFGS secant condition by modifying Perry's method. Our proposed met...
Journal
Journal title: Mathematical Sciences
Year: 2019
ISSN: 2008-1359,2251-7456
DOI: 10.1007/s40096-019-00310-y